
A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks



Abstract

We present the soft exponential activation function for artificial neural networks that continuously interpolates between logarithmic, linear, and exponential functions. This activation function is simple, differentiable, and parameterized so that it can be trained as the rest of the network is trained. We hypothesize that soft exponential has the potential to improve neural network learning, as it can exactly calculate many natural operations that typical neural networks can only approximate, including addition, multiplication, inner product, distance, polynomials, and sinusoids.
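The interpolation described above can be sketched as follows. This is a minimal illustration, assuming the piecewise form given in the paper: a trainable parameter α selects a logarithmic regime (α < 0), the identity (α = 0), or an exponential regime (α > 0); the function name and test values here are illustrative, not from the source.

```python
import math

def soft_exponential(alpha, x):
    """Soft exponential activation: interpolates log <-> linear <-> exp.

    Assumed piecewise definition:
      alpha < 0:  -ln(1 - alpha*(x + alpha)) / alpha
      alpha = 0:  x                      (linear / identity)
      alpha > 0:  (exp(alpha*x) - 1) / alpha + alpha
    """
    if alpha < 0:
        # Logarithmic regime; at alpha = -1 this reduces to ln(x).
        return -math.log(1.0 - alpha * (x + alpha)) / alpha
    if alpha == 0:
        # Exactly linear at alpha = 0.
        return x
    # Exponential regime; at alpha = 1 this reduces to exp(x).
    return (math.exp(alpha * x) - 1.0) / alpha + alpha

# At the extremes the function recovers pure log and pure exp:
print(soft_exponential(-1.0, 5.0))  # ln(5)
print(soft_exponential(0.0, 3.7))   # 3.7
print(soft_exponential(1.0, 2.0))   # exp(2)
```

Because the endpoints are exact log and exp, a layer using this activation can in principle compute products and quotients of its inputs exactly (log, sum, then exp), which ordinary sigmoidal or ReLU networks can only approximate.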


